The software engineering community has defined a number of metrics that quantify aspects of the design of an object-oriented software system. Measurements that fall outside normal ranges point to potential quality problems within the code base. The information provided by the metrics is not absolute; rather, it indicates areas that the team should analyze more carefully with an eye toward refactoring and other design improvements. Typically, IDEs incorporate metric suites and make it easy to calculate the entire set of measurements on the project code base.
Learning Outcomes
Explain the different classes of metrics that are defined for object-oriented software systems
Describe the measurements being made by object-oriented code metrics
Analyze the design and implementation of a software system based on code metrics
Study Resources
For your study of this topic, use these resources.
in Essentials of Software Engineering, Third Edition by Frank Tsui, Orlando Karam, and Barbara Bernal, available on Skillsoft through the RIT library
in Software Engineering: A Hands-On Approach by Roger Y. Lee, available on Skillsoft through the RIT library
in Software Quality Assurance: In Large Scale and Complex Software-Intensive Systems by Ivan Mistrik, Richard Soley, Nour Ali, John Grundy, and Bedir Tekinerdogan (eds), available on Skillsoft through the RIT library
in Handbook of Software Quality Assurance, Fourth Edition by G. Gordon Schulmeyer (ed), available on Skillsoft through the RIT library
Take 2 screen shots to validate your installation:
SonarQube running from your browser
SonarScanner running from your terminal. Verify your installation by opening a new shell and executing the command sonarscanner -h. If that fails, try sonar-scanner -h (which likely also works on Mac), or sonar-scanner.bat -h or sonar-scanner.cmd -h in Windows CMD/PowerShell. Make sure that your screenshot shows the entire shell window you are using, including your command, so we can validate what works on which system. If you would like, you can help us by pasting this information in the comment area of your submission (e.g., on Mac, paste the results of sw_vers and report "sonar-scanner -h works on Mac version: xyz"). Please notify your instructor in advance if none of the above work for you. Be sure to provide details of your environment, shell, and versions to facilitate getting help.
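If you want to check all the candidate command names at once before taking your screenshot, a small helper loop like the following can probe which name your shell resolves. This is an optional sketch for POSIX shells (bash/zsh), not part of the assignment; on Windows CMD/PowerShell, try sonar-scanner.bat or sonar-scanner.cmd directly instead.

```shell
# Probe which SonarScanner command name is on your PATH.
# "command -v" succeeds only if the shell can resolve the name.
for cmd in sonarscanner sonar-scanner; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "found: $cmd"
  else
    echo "not found: $cmd"
  fi
done
```

Whichever name is reported as found is the one to run with -h for your screenshot.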
By the date specified on the schedule for your section, deposit the image(s) in the myCourses Assignments folder, Static Code Analysis Tool Setup - individual.
By the date specified on the schedule for your section, take a screen shot of the initial complexity metric analysis and deposit the image in the myCourses Assignments folder, Static Code Analysis - individual.
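For reference, a SonarScanner analysis is typically configured with a sonar-project.properties file in the project root. The values below are placeholders to adapt to your own project, not required settings for this course; older scanner versions use sonar.login in place of sonar.token.

```properties
# Minimal sonar-project.properties sketch; all values are placeholders.
sonar.projectKey=my-team-project
sonar.projectName=My Team Project
# Path(s) to the source code to analyze, relative to this file
sonar.sources=src
# Default URL of a locally running SonarQube server
sonar.host.url=http://localhost:9000
# Authentication token generated in the SonarQube UI
sonar.token=REPLACE_WITH_GENERATED_TOKEN
```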
After-Class Exercises
Static Code Analysis - team:
Continue your team's exploration and analysis of the code metrics as described in the Static Code Analysis Exercise and Project Analysis. This is a good practice to carry out as you move forward. The documentation that you submit for Sprint 4 will include your in-depth analysis of the code metrics for your project and recommendations for redesigning and reimplementing the areas of your project in the hotspots that the code metrics indicate. If you act on these before the end of your implementation, keep a chronology of screenshots so you can reflect on and demonstrate how and what progress was made.